Neural Networks: Algorithms and Special Architectures

Authors

  • Bharat Bhushan
  • Madhusudan Singh
Abstract

The paper focuses on neural networks, their learning algorithms, special architectures, and SVMs. A general learning rule, expressed as a function of the incoming signals, is discussed. Other learning rules, such as Hebbian learning, delta learning, perceptron learning, Least Mean Square (LMS) learning, and Winner Take All (WTA) learning, are presented as derivations of the general learning rule. Architecture-specific learning algorithms for cascade correlation networks, functional link networks, counterpropagation networks, and Radial Basis Function (RBF) networks are described. A case study of neural-network-based identification and control is presented for temperature control of a water bath. Support Vector Machines (SVMs), an extremely powerful method of deriving efficient models for multidimensional function approximation and classification, are also discussed.

Introduction

Artificial neural networks are systems deliberately constructed to make use of some organizational principles resembling those of the human brain, and they represent a promising new generation of information processing systems. Artificial neural networks (ANNs) have a large number of highly interconnected processing elements (nodes or units) that usually operate in parallel and are configured in regular architectures. The collective behavior of an ANN, like that of a human brain, demonstrates the ability to learn, recall, and generalize from training patterns or data. ANNs are inspired by models of networks of real (biological) neurons in the brain. A survey [1]-[13] of neural network learning methods and special feedback architectures has been carried out. Single-layer and multilayer feedforward networks and their associated supervised learning rules are described, along with a case study on temperature control of a water bath using a feedforward NN.
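The general learning rule mentioned above, a weight update driven by the incoming signal and an error or learning signal, can be illustrated with the LMS (Widrow-Hoff) case. The following is a minimal sketch; the learning rate, target function, and data are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def lms_update(w, x, d, eta=0.1):
    """One LMS (Widrow-Hoff) step: w <- w + eta * (d - w.x) * x."""
    y = np.dot(w, x)   # linear neuron output
    error = d - y      # error signal drives the weight change
    return w + eta * error * x

# Learn the linear map d = 2*x1 - x2 from random samples (illustrative data)
rng = np.random.default_rng(0)
w = np.zeros(2)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=2)
    d = 2.0 * x[0] - x[1]
    w = lms_update(w, x, d)
print(w)  # converges to approximately [2, -1]
```

Because the target here is exactly realizable by a linear neuron, the error-driven update converges to the true weights; the same update form reappears, with different learning signals, in the Hebbian, perceptron, and delta rules surveyed in the paper.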
Various learning algorithms, including the Hebbian learning rule, correlation learning rule, instar learning rule, Winner Takes All (WTA), outstar learning rule, perceptron learning rule, Widrow-Hoff learning rule, delta learning rule, and error backpropagation learning, are discussed. Special feedforward architectures, such as functional link networks, the feedforward version of the counterpropagation network, Learning Vector Quantization (LVQ), the WTA architecture, the cascade correlation architecture, and Radial Basis Function (RBF) networks, are also covered. Both the multilayer perceptron and the radial basis function network are based on the popular learning paradigm of error-correction learning: the synaptic weights of these networks are adjusted to reduce the difference (error) between the desired target value and the corresponding output. SVMs offer an extremely powerful method of deriving efficient models for multidimensional function approximation and classification.

Neural Networks

Feedforward neural networks allow only one-directional signal flow, and most feedforward neural networks are organized in layers. An example of a three-layer feedforward neural network is shown in Figure 1.

[Figure 1: a three-layer feedforward network with an input layer, hidden layers, and an output layer]
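A three-layer feedforward network of this kind, trained by error-correction learning (backpropagation), can be sketched as follows. The layer sizes, sigmoid activation, learning rate, and XOR training data are assumptions chosen for illustration, not details from the paper:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ThreeLayerNet:
    """Feedforward network: input layer -> one hidden layer -> output layer."""
    def __init__(self, n_in=2, n_hidden=3, n_out=1, seed=1):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 1.0, (n_hidden, n_in))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 1.0, (n_out, n_hidden))
        self.b2 = np.zeros(n_out)

    def forward(self, x):
        h = sigmoid(self.W1 @ x + self.b1)  # hidden-layer activations
        y = sigmoid(self.W2 @ h + self.b2)  # network output
        return h, y

    def backprop_step(self, x, d, eta=0.5):
        """Error-correction step: adjust weights to reduce 0.5*(d - y)^2."""
        h, y = self.forward(x)
        delta2 = (y - d) * y * (1 - y)               # output error signal
        delta1 = (self.W2.T @ delta2) * h * (1 - h)  # backpropagated error
        self.W2 -= eta * np.outer(delta2, h)
        self.b2 -= eta * delta2
        self.W1 -= eta * np.outer(delta1, x)
        self.b1 -= eta * delta1

# XOR patterns as illustrative training data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([0.0, 1.0, 1.0, 0.0])

net = ThreeLayerNet()
err_before = sum(float((net.forward(x)[1] - d) ** 2) for x, d in zip(X, D))
for _ in range(2000):
    for x, d in zip(X, D):
        net.backprop_step(x, d)
err_after = sum(float((net.forward(x)[1] - d) ** 2) for x, d in zip(X, D))
print(err_before, err_after)
```

The per-layer error signals (`delta2`, `delta1`) are exactly the error-correction quantities the paper attributes to backpropagation: each weight moves in proportion to its input and the error propagated back to its layer, so the total squared error falls over training.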




Publication date: 2010